On Learnability, Complexity and Stability
Authors
Abstract
We consider the fundamental question of learnability of a hypothesis class, both in the supervised learning setting and in the general learning setting introduced by Vladimir Vapnik. We survey classic results characterizing learnability in terms of suitable notions of complexity, as well as more recent results that establish the connection between learnability and the stability of a learning algorithm.
Similar Articles
Stability Conditions for Online Learnability
Stability is a general notion that quantifies the sensitivity of a learning algorithm's output to small changes in the training dataset (e.g. deletion or replacement of a single training sample). Such conditions have recently been shown to be more powerful for characterizing learnability in the general learning setting under i.i.d. samples, where uniform convergence is not necessary for learnability...
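The notion above can be made concrete with a small numerical sketch. The function below, a purely illustrative construction not taken from any of the papers listed here, measures the largest change in a learner's output when a single training point is replaced by a fresh one; the trivial "predict the sample mean" rule stands in for a learning algorithm, and the name `replace_one_stability` is our own:

```python
import random


def train_mean(sample):
    """A toy learning rule: output the sample mean."""
    return sum(sample) / len(sample)


def replace_one_stability(sample, fresh_points, learn=train_mean):
    """Worst-case change in the learned output when one training
    point is replaced by a fresh point (illustrative sketch)."""
    base = learn(sample)
    worst = 0.0
    for i in range(len(sample)):
        for z in fresh_points:
            perturbed = sample[:i] + [z] + sample[i + 1:]
            worst = max(worst, abs(learn(perturbed) - base))
    return worst


random.seed(0)
data = [random.gauss(0, 1) for _ in range(100)]
fresh = [random.gauss(0, 1) for _ in range(5)]
print(replace_one_stability(data, fresh))
```

For the mean rule, replacing one of n points changes the output by at most |z - x_i| / n, so the measured sensitivity shrinks as the training set grows; this vanishing sensitivity is the kind of behavior the stability-based characterizations of learnability formalize.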
Uniform Convergence, Stability and Learnability for Ranking Problems
Most studies have been devoted to the design of efficient algorithms and to evaluation and application on diverse ranking problems, whereas little work has addressed the theoretical study of ranking learnability. In this paper, we study the relation between uniform convergence, stability and learnability of ranking. In contrast to supervised learning, where learnability is equivalent to unifor...
Learnability and Stability in the General Learning Setting
We establish that stability is necessary and sufficient for learning, even in the General Learning Setting where uniform convergence conditions are not necessary for learning, and where learning might only be possible with a non-ERM learning rule. This goes beyond previous work on the relationship between stability and learnability, which focused on supervised classification and regression, whe...
Sufficient Conditions for Learnability and Upper Bounds on Sample Complexity: 2.1 Learnability of a Binary-Valued Function Class
Learnability of Augmented Naive Bayes in Nominal Domains
It is well known that Naive Bayes can only represent linearly separable functions in binary domains, but the learnability of general Augmented Naive Bayes is open. Little work has been done on the learnability of Bayesian networks in nominal domains, a generalization of binary domains. This paper explores the learnability of Augmented Naive Bayes in nominal domains. We introduce a complexity measure fo...
Journal: CoRR
Volume: abs/1303.5976
Publication date: 2013